# Long-text retrieval

## M2 BERT 128 Retrieval Encoder V1

License: Apache-2.0 · Author: hazyresearch · Tags: Text Embedding, Transformers, English

M2-BERT-128 is an 80-million-parameter retrieval model checkpoint proposed in the paper "Benchmarking and Building Long-Context Retrieval Models with LoCo and M2-BERT".
## M2 Bert 80M 2k Retrieval

License: Apache-2.0 · Author: togethercomputer · Tags: Text Embedding, Transformers, English

An 80M-parameter M2-BERT pre-trained checkpoint with a sequence length of 2048, fine-tuned for long-context retrieval tasks.
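Both checkpoints above are text embedding models: each maps a query or document to a fixed-size vector, and retrieval then ranks documents by cosine similarity between those vectors. A minimal sketch of that ranking step, using toy NumPy vectors in place of real M2-BERT embeddings (the vectors and dimensionality here are illustrative, not model output):

```python
import numpy as np

def cosine_rank(query_vec, doc_vecs):
    """Rank document embeddings by cosine similarity to a query embedding."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    scores = d @ q                  # cosine similarity per document
    order = np.argsort(-scores)     # indices from best to worst match
    return order, scores[order]

# Toy 4-dimensional embeddings standing in for encoder output.
query = np.array([1.0, 0.0, 1.0, 0.0])
docs = np.array([
    [1.0, 0.1, 0.9, 0.0],   # points in a similar direction to the query
    [0.0, 1.0, 0.0, 1.0],   # orthogonal to the query
])
order, scores = cosine_rank(query, docs)
# order[0] == 0: the first document is the closest match.
```

In practice the toy vectors would be replaced by embeddings produced by one of the checkpoints above; the ranking logic itself is unchanged.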